Isserlis' Theorem

In probability theory, Isserlis' theorem or Wick's probability theorem is a formula that allows one to compute higher-order moments of the multivariate normal distribution in terms of its covariance matrix. It is named after Leon Isserlis. The theorem is also particularly important in particle physics, where it is known as Wick's theorem after the work of Gian-Carlo Wick. Other applications include the analysis of portfolio returns, quantum field theory and the generation of colored noise.


Statement

If (X_1,\dots,X_{2n}) is a zero-mean multivariate normal random vector, then
\operatorname{E}[X_1 X_2 \cdots X_{2n}] = \sum_{p\in P_{2n}^2} \prod_{\{i,j\}\in p} \operatorname{E}[X_i X_j] = \sum_{p\in P_{2n}^2} \prod_{\{i,j\}\in p} \operatorname{Cov}(X_i, X_j),
where the sum is over all the pairings P_{2n}^2 of \{1,\ldots,2n\}, i.e. all distinct ways of partitioning \{1,\ldots,2n\} into pairs \{i,j\}, and the product is over the pairs contained in p. In his original paper, Leon Isserlis proved this theorem by mathematical induction, generalizing the formula for the 4th-order moments, which takes the form
\operatorname{E}[X_1 X_2 X_3 X_4] = \operatorname{E}[X_1 X_2]\,\operatorname{E}[X_3 X_4] + \operatorname{E}[X_1 X_3]\,\operatorname{E}[X_2 X_4] + \operatorname{E}[X_1 X_4]\,\operatorname{E}[X_2 X_3].
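The following is a minimal Python sketch (not from the original article) that enumerates the pair partitions and evaluates the right-hand side of the theorem; the covariance matrix and the Monte Carlo comparison are illustrative assumptions.

```python
import numpy as np

def pair_partitions(indices):
    """Yield every way of splitting a list of indices into unordered pairs."""
    if not indices:
        yield []
        return
    first, rest = indices[0], indices[1:]
    for k in range(len(rest)):
        remaining = rest[:k] + rest[k + 1:]
        for tail in pair_partitions(remaining):
            yield [(first, rest[k])] + tail

def isserlis_moment(cov, indices):
    """E[X_{i_1} ... X_{i_n}] for a zero-mean Gaussian vector with covariance cov."""
    if len(indices) % 2 == 1:
        return 0.0  # odd-order moments vanish
    return sum(np.prod([cov[i, j] for i, j in p])
               for p in pair_partitions(list(indices)))

# Illustrative 4x4 covariance matrix (assumed for this example).
cov = np.array([[2.0, 0.5, 0.3, 0.1],
                [0.5, 1.0, 0.2, 0.4],
                [0.3, 0.2, 1.5, 0.6],
                [0.1, 0.4, 0.6, 1.0]])
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(4), cov, size=1_000_000)
print(isserlis_moment(cov, (0, 1, 2, 3)))   # exact: 0.5*0.6 + 0.3*0.4 + 0.1*0.2 = 0.44
print(np.prod(samples, axis=1).mean())      # Monte Carlo estimate, close to 0.44
```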


Odd case

If n=2m+1 is odd, there does not exist any pairing of \{1,\ldots,2m+1\}. Under this hypothesis, Isserlis' theorem implies that
\operatorname{E}[X_1 X_2 \cdots X_{2m+1}] = 0.
This also follows from the fact that -X=(-X_1,\dots,-X_n) has the same distribution as X, which implies that
\operatorname{E}[X_1 \cdots X_{2m+1}] = \operatorname{E}[(-X_1) \cdots (-X_{2m+1})] = -\operatorname{E}[X_1 \cdots X_{2m+1}] = 0.


Even case

If n=2m is even, there exist (2m)!/(2^m m!) = (2m-1)!! (see double factorial) pair partitions of \{1,\ldots,2m\}: this yields (2m)!/(2^m m!) = (2m-1)!! terms in the sum. For example, for 4th-order moments (i.e. 4 random variables) there are three terms. For 6th-order moments there are 3\times 5=15 terms, and for 8th-order moments there are 3\times 5\times 7 = 105 terms.
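As a quick check on this count, a short sketch (assuming Python 3.8+ for math.prod) compares the double factorial with the numbers of terms quoted above:

```python
from math import prod

def num_pair_partitions(n):
    """(n-1)!! for even n, i.e. the number of pair partitions of {1, ..., n}; 0 for odd n."""
    if n % 2 == 1:
        return 0
    return prod(range(1, n, 2))  # 1 * 3 * 5 * ... * (n-1)

for n in (4, 6, 8):
    print(n, num_pair_partitions(n))  # 4 -> 3, 6 -> 15, 8 -> 105
```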


Proof

Let \Sigma_{ij} = \operatorname{Cov}(X_i, X_j) be the covariance matrix, so that we have the zero-mean multivariate normal random vector (X_1, \ldots, X_n) \sim N(0, \Sigma). Using the quadratic factorization -x^T\Sigma^{-1}x/2 + v^Tx - v^T\Sigma v/2 = -(x-\Sigma v)^T\Sigma^{-1}(x-\Sigma v)/2, we get
\frac{1}{\sqrt{(2\pi)^n \det\Sigma}}\int e^{-x^T\Sigma^{-1}x/2 + v^Tx}\,dx = e^{v^T\Sigma v/2}.
Differentiating under the integral sign with \partial_{v_1}\cdots\partial_{v_n}\big|_{v=0} yields
\operatorname{E}[X_1\cdots X_n] = \partial_{v_1}\cdots\partial_{v_n}\big|_{v=0}\, e^{v^T\Sigma v/2}.
That is, we need only find the coefficient of the term v_1\cdots v_n in the Taylor expansion of e^{v^T\Sigma v/2}. If n is odd, this coefficient is zero. So let n = 2m; then we need only find the coefficient of v_1\cdots v_n in the polynomial \frac{1}{m!}(v^T\Sigma v/2)^m. Expanding this polynomial and counting the terms gives the formula. \square
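To make the last counting step concrete, here is a worked instance for m = 2 (added for illustration, not part of the original proof). Expanding
\frac{1}{2!}\Big(\tfrac{1}{2}\sum_{i,j}\Sigma_{ij} v_i v_j\Big)^2 = \frac{1}{8}\sum_{i,j,k,l}\Sigma_{ij}\Sigma_{kl}\, v_i v_j v_k v_l,
the monomial v_1 v_2 v_3 v_4 arises from the index tuples (i,j,k,l) that are permutations of (1,2,3,4); using the symmetry of \Sigma, each of the three pairings contributes 8 such tuples, so the coefficient is \Sigma_{12}\Sigma_{34} + \Sigma_{13}\Sigma_{24} + \Sigma_{14}\Sigma_{23}, matching the fourth-order formula above.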


Generalizations


Gaussian integration by parts

An equivalent formulation of Wick's probability formula is Gaussian integration by parts. If (X_1,\dots, X_n) is a zero-mean multivariate normal random vector, then
\operatorname{E}(X_1 f(X_1,\ldots,X_n)) = \sum_{i=1}^{n} \operatorname{E}(X_1 X_i)\,\operatorname{E}(\partial_{X_i} f(X_1,\ldots,X_n)).
Wick's probability formula can be recovered by induction, considering the function f:\mathbb{R}^n\to\mathbb{R} defined by f(x_1,\ldots,x_n)=x_2\cdots x_n. Among other things, this formulation is important in Liouville conformal field theory, where it is used to obtain conformal Ward identities and BPZ equations, and to prove the Fyodorov-Bouchaud formula.
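Below is a minimal Monte Carlo sketch of this identity for f(x_1,\ldots,x_4) = x_2 x_3 x_4 (the covariance matrix and sample size are illustrative assumptions, not from the source); note that with exact expectations the right-hand side is precisely the fourth-order Isserlis formula, which is how the induction proceeds.

```python
import numpy as np

# Check E[X1 f(X)] = sum_i E[X1 Xi] E[d_i f(X)] for f(x) = x2*x3*x4.
rng = np.random.default_rng(1)
cov = np.array([[2.0, 0.5, 0.3, 0.1],   # illustrative covariance matrix
                [0.5, 1.0, 0.2, 0.4],
                [0.3, 0.2, 1.5, 0.6],
                [0.1, 0.4, 0.6, 1.0]])
x = rng.multivariate_normal(np.zeros(4), cov, size=2_000_000)
x1, x2, x3, x4 = x.T

lhs = np.mean(x1 * x2 * x3 * x4)           # E[X1 f(X)]
rhs = (cov[0, 1] * np.mean(x3 * x4)        # E[X1 X2] E[d_2 f]
       + cov[0, 2] * np.mean(x2 * x4)      # E[X1 X3] E[d_3 f]
       + cov[0, 3] * np.mean(x2 * x3))     # E[X1 X4] E[d_4 f]
print(lhs, rhs)  # both close to 0.5*0.6 + 0.3*0.4 + 0.1*0.2 = 0.44
```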


Non-Gaussian random variables

For non-Gaussian random variables, the moment-cumulants formula replaces Wick's probability formula. If (X_1,\dots, X_n) is a vector of random variables, then
\operatorname{E}(X_1 \cdots X_n) = \sum_{p}\prod_{b\in p} \kappa\big((X_i)_{i\in b}\big),
where the sum is over all the partitions p of \{1,\ldots,n\}, the product is over the blocks b of p, and \kappa\big((X_i)_{i\in b}\big) is the joint cumulant of (X_i)_{i\in b}.
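For instance (a worked case added for illustration), the five partitions of \{1,2,3\} give
\operatorname{E}(X_1 X_2 X_3) = \kappa(X_1,X_2,X_3) + \kappa(X_1)\kappa(X_2,X_3) + \kappa(X_2)\kappa(X_1,X_3) + \kappa(X_3)\kappa(X_1,X_2) + \kappa(X_1)\kappa(X_2)\kappa(X_3).
For a zero-mean Gaussian vector all cumulants of order one and of order three or higher vanish, so only products of pair cumulants \kappa(X_i,X_j) = \operatorname{Cov}(X_i,X_j) survive and the formula reduces to Isserlis' theorem.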


See also

* Wick's theorem
* Cumulants
* Normal distribution

